Course Information
Course Title
圖書館與資訊系統評估
Library and Information System Evaluation 
Semester
103-2 
Intended Audience
College of Liberal Arts, Graduate Institute of Library and Information Science
Instructor
唐牧群 
Course Number
LIS5067 
Course Identification Code
126 U1360 
Section
 
Credits
Full/Half Year
Half year
Required/Elective
Elective
Class Time
Thursday, periods 5-7 (12:20-15:10)
Classroom
LIS Conference Room
Remarks
U-designated course; open to both undergraduate and graduate students.
Enrollment limit: 17
Course Website
http://www.lis.ntu.edu.tw/~mctang/courses/evaluation/index.html 
Course Introduction Video
 
Course Outline
Course Description

This course examines the goals, criteria, measures, measuring instruments, and methods for evaluating a variety of library and information services. Emphasis is placed on the use of data for planning and decision-making purposes. Topics covered include collection evaluation, applied bibliometrics, availability analysis, experimental design, protocol analysis, focus groups, and survey interviews.

Course Objectives
Students will
1. Acquire a basic understanding of the methods available for evaluating library collection and information services;
2. Identify the most appropriate method for a particular evaluation project;
3. Design and implement an evaluation project; and
4. Understand the opportunities and challenges of evaluation in a networked environment.
Course Requirements
Class participation: 10% of your final grade
Participation will be evaluated based on your attendance and in-class contributions (i.e., the questions you raise and the answers you give in class).
Group projects:
*For undergraduate students:
Students will form groups of 2 to 3 to complete all four assignments.
*For graduate students:
Graduate students will work in groups for the first (usability study review), second (usability testing), and third (user study review) assignments; for the fourth assignment (the evaluation research proposal), each student will work independently.
*For all group projects, in addition to the group report, each individual will also write a half-page personal report on his/her contributions and reflections on the assignment.
1. Usability study review from Tullis and Albert: 10%
Each group will report on one of the six usability projects listed in Chapter 10 of Tullis and Albert.
2. Empirical usability testing exercise: 30%
For this assignment, each group will conduct an empirical usability study of an online information resource of your choosing.
Definition of an online service:
An online service can be a library web site, library OPAC, bibliographic database, or e-commerce/government site. The user study can be either an IR performance evaluation or interface usability test.
Research participants:
The study should include at least 2 participants, who are potential users of the online service.
Tasks:
The participants will be asked to perform tasks (genuine or assigned) so the usability of the site can be evaluated. Their online activities will be recorded using screen capture software such as Morae.
Report:
You will write a 4-6 page paper reporting your methodology and findings and present it to the class. In your findings, try to point out the usability issues of the site and make recommendations on how it can be improved.
Detailed instructions for usability exercise
3. User study article review: 20%
Students will work in groups of 2 to 3 to present one evaluation study (see below for the list of articles to be reviewed). This assignment is designed to help you become familiar with actual user study procedures. You should therefore focus mainly on the methodologies used in these studies. Specifically, the review should cover the six basic elements of evaluation research:
1. Construct
2. Criteria
3. Measures
4. Measuring instruments
5. Methodology
6. Strengths and potential flaws (e.g. threats to external or internal validity)
No written report is required for this assignment; give a 20-minute PowerPoint presentation of your review and turn in the ppt file.
Survey study
*Wixom, B. H. and Todd, P. A. (2005). A theoretical integration of user satisfaction and technology acceptance. Information Systems Research, 16(1), pp. 85-102.
*Nitecki, Danuta A. and Hernon, Peter (2000). Measuring service quality at Yale University's libraries. Journal of Academic Librarianship, 26(4), pp. 259-273.
Webmetrics study
*Fang, Wei (2007). Using Google Analytics for improving library website content and design: a case study. Library Philosophy and Practice, 2007.
Experimental study
*Koenemann, J., & Belkin, N. J. (1996). A case for interaction: a study of interactive information retrieval behavior and effectiveness. In Proceedings of the ACM SIGCHI conference on human factors in computer systems, pp. 205–212.
*Dumais, S., et al. (2003). Stuff I've Seen: a system for personal information retrieval and re-use. SIGIR '03, Jul. 28-Aug. 1, 2003, pp. 1-8.
*Joho, H and Jose, J.M. (2006). Slicing and dicing the information space using local contexts. Information Interaction in Context.
Qualitative methodology
*Karapanos, E., Zimmerman, J., and Martens, J. (2009). User experience over time: an initial framework. CHI 2009.
*Shneiderman, Ben and Plaisant, Catherine (2006). Strategies for evaluating information visualization tools: multi-dimensional in-depth long-term case studies. Proc. AVI Workshop on BEyond time and errors: novel evaLuation methods for Information Visualization (BELIV), 2006, pp. 38-43.

4. Final project: 30%
Each group (for graduate students, each individual) will write an evaluation research proposal due at the end of the semester. As the focus of the assignment is the methodology section, a lengthy literature review is not required. It will be a user study proposal consisting of the following components (see Hernon & McClure, 1990, Chap. 4):
a. Background and problem statement (1-2 pages)
b. Study objectives (1-2 pages)
c. Research Questions (1-2 pages)
d. Research procedures (methodology, design, instruments) (4-8 pages)
e. Expected difficulties (1-2 pages)
f . Presentation of the project to the class
*Ungraded assignments: progress reports
Each student will give 3 progress reports throughout the semester (see the course calendar for due dates). These are designed so that you can get feedback from the instructor and the class on the different components of your final project.
1. Report 1: background and problem statement (within one page)
2. Report 2: research questions and design (within one page)
3. Report 3: pre-test draft instruments (as long as it takes)
Expected Weekly Study Hours Outside of Class
 
Office Hours
 
Required Readings
 
References
Bertot, John Carlo (2004). Assessing digital library services: approaches, issues, and considerations. International Symposium on Digital Libraries and Knowledge Communities in Networked Information Society 2004 (DLKC'04). Available online.

Battleson, Brenda, Austin Booth & Jane Weintrop (2001). Usability testing of an academic library Web site: a case study. The Journal of Academic Librarianship, 27, pp. 188-198.

Ciliberti, Anne, Marie L. Radford, Gary P. Radford, and Terry Ballard (1998). Empty handed? A material availability study and transaction log analysis verification. The Journal of Academic Librarianship, 24(4), pp. 282-289.

Cook, Colleen & Fred M. Heath (2001a). Users' perceptions of library service
quality: a LibQUAL+ qualitative study.

Cook, Colleen & Fred M. Heath (2001b). LibQUAL+: service quality assessment. IFLA conference proceedings. Available online.

Cook, Colleen, Fred Heath and Bruce Thompson (2000). A new culture of assessment: preliminary report on the ARL SERVQUAL survey. IFLA conference proceedings 66. Available online.

Covey, Denise Troll (2002). Usage and usability assessment: library practices and concerns. Washington, DC: Digital Library Federation, Council on Library and Information Resources. 93 p. Available online.

Nitecki, Danuta A. and Hernon, Peter (2000). Measuring service quality at Yale University's libraries. Journal of Academic Librarianship, 26(4), pp. 259-273.

Hart, J., C. Ridley, F. Taher, C. Sas, A. Dix (2008). Exploring the Facebook
Experience: a new approach to usability. NordiCHI, 2008.

Hersh, William R. (2003). Information retrieval: a health and biomedical perspective. New York: Springer.

Herlocker, Jonathan L., Joseph A. Konstan, Loren G. Terveen, & John T. Riedl (2004). Evaluating collaborative filtering recommender systems. ACM Transactions on Information Systems (TOIS), 22(1), pp. 5-53. doi:10.1145/963770.963772

Hernon, Peter & Charles R. McClure (1990). Evaluation & Library Decision Making.
Norwood, N.J.: Ablex.

Hassenzahl, M. and Tractinsky, N. (2006). User experience - a research agenda. Behaviour and Information Technology, 25(2), pp. 91-97.

Hiller, Steve (2001). Assessing user needs, satisfaction, and library performance at the University of Washington Libraries. Library Trends, 49(4), pp. 605-625.

Hiller, Steve, and James Self (2004). From measurement to management: using data wisely for planning and decision-making. Library Trends, 53(1), pp. 129-155.

Ho, Jeannette & Gwyneth H. Crowley (2003). User perceptions of the "reliability" of library services at Texas A&M University: a focus group study. Journal of Academic Librarianship, 29(2), pp. 82-87.

Ke, Hao-Ren, Rolf Kwakkelaar, Yu-Min Tai, & Li-Chun Chen (2002). Exploring
behavior of e-journal users in science and technology: transaction log analysis
of Elsevier's ScienceDirect OnSite in Taiwan. Library and Information Science
Research 24,pp. 265-291.

Kantor (1984). Objective performance measures for academic and research libraries.

Keppel, G., & Wickens, T. A. (2004). Design and analysis: A researcher's handbook (4th ed.). Upper Saddle River, NJ: Prentice-Hall.

Kyrillidou, Martha (2002). From input and output measures to quality and outcome
measures, or, from the user in the life of the library to the library in the
life of the user.

Lin Tom M. Y., Luarn Pin and Yun Kuei Huang (2005). Effect of Internet book
reviews on purchase intention: a focus group study. Journal of Academic
Librarianship 31(5), pp. 461-468.

Lancaster, F.W. (1993). If you want to evaluate your library. London : Library
Association.

Law, Effie Lai-Chong, Virpi Roto, Marc Hassenzahl, Arnold P.O.S. Vermeeren, and
Joke Kort. 2009. Understanding, scoping and defining user experience: a survey
approach. In Proceedings of the 27th international conference on Human factors
in computing systems (CHI '09). ACM, New York, NY, USA, 719-728.

Pesch, Oliver (2004). Using statistics: taking e-metrics to the next level. The
Serials Librarian 46(1/2),pp.143-154.

Siegel et al. (1991). Evaluating the impact of MEDLINE using the Critical Incident Technique. Proc Annu Symp Comput Appl Med Care, 1991, pp. 83-87.

Taylor, R. S. (1984). Value-added processes in information systems. Norwood, NJ: Ablex.
Matthews, J. R. (2007). The Evaluation and Measurement of Library Services. Westport, CT: Libraries Unlimited.
Grading
(For reference only)
   
Course Schedule
Week
Date
Topic